
    Season-invariant GNSS-denied visual localization for UAVs

    Localization without Global Navigation Satellite Systems (GNSS) is a critical functionality in autonomous operations of unmanned aerial vehicles (UAVs). Vision-based localization on a known map can be an effective solution, but it is burdened by two main problems: places look different depending on weather and season, and the perspective discrepancy between the UAV camera image and the map makes matching hard. In this work, we propose a localization solution that matches UAV camera images to georeferenced orthophotos with a trained convolutional neural network model that is invariant to significant seasonal appearance differences (winter-summer) between the camera image and the map. We compare the convergence speed and localization accuracy of our solution to six reference methods. The results show major improvements over the reference methods, especially under high seasonal variation. Finally, we demonstrate that the method successfully localizes a real UAV, showing that it is robust to perspective changes.

    Comment: Published in IEEE Robotics and Automation Letters (Volume 7, Issue 4, October 2022).
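    The matching step described above can be sketched as nearest-neighbor search over descriptor embeddings. This is a minimal illustration only: the random-projection `embed` function stands in for the paper's trained season-invariant CNN, and all names, tile sizes, and dimensions are assumptions.

    ```python
    import numpy as np

    def embed(image: np.ndarray) -> np.ndarray:
        """Placeholder for a learned season-invariant descriptor network.
        Here: a fixed random projection of the flattened image, L2-normalized."""
        rng = np.random.default_rng(0)            # fixed weights stand in for trained ones
        w = rng.standard_normal((image.size, 64))
        d = image.ravel() @ w
        return d / np.linalg.norm(d)

    def localize(query: np.ndarray, map_tiles: list[np.ndarray]) -> int:
        """Return the index of the map tile whose descriptor is closest
        (highest cosine similarity) to the UAV camera image descriptor."""
        q = embed(query)
        sims = [float(q @ embed(t)) for t in map_tiles]
        return int(np.argmax(sims))

    # Toy example: three 8x8 map tiles; the query image equals tile 1,
    # so descriptor similarity is maximal there.
    tiles = [np.random.default_rng(i).random((8, 8)) for i in range(3)]
    best = localize(tiles[1], tiles)
    ```

    A trained network would map a winter camera image and a summer orthophoto of the same place to nearby descriptors, which is exactly what the fixed projection here cannot do; it only demonstrates the matching mechanics.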

    LSVL: Large-scale season-invariant visual localization for UAVs

    Funding Information: This work was supported by Business Finland project Multico (6575/31/2019) and Saab Finland Oy. We thank Olli Knuuttila for the implementation work in real-time experiments. We thank Mika Järvenpää and Kari Hautio for testing opportunities and support with implementation on the Nokia Drone Networks device.

    Publisher Copyright: © 2023 The Author(s)

    Localization of autonomous unmanned aerial vehicles (UAVs) relies heavily on Global Navigation Satellite Systems (GNSS), which are susceptible to interference. Especially in security applications, robust localization algorithms independent of GNSS are needed to provide dependable operation of autonomous UAVs even under interference. Typical non-GNSS visual localization approaches rely on a known starting pose, work only on a small map, or require the flight path to be known before a mission starts. We consider the problem of localization with no information on the initial pose or planned flight path. We propose a solution for global visual localization on large maps, based on matching orthoprojected UAV images to satellite imagery using learned season-invariant descriptors, and test it with environment sizes up to 100 km². We show that the method determines the heading, latitude, and longitude of the UAV with 12.6–18.7 m lateral translation error in as few as 23.2–44.4 updates from an uninformed initialization, also in situations of significant seasonal appearance difference (winter–summer) between the UAV image and the map.
    We evaluate the characteristics of multiple neural network architectures for generating the descriptors, as well as likelihood estimation methods that provide fast convergence and low localization error. We also evaluate the operation of the algorithm on real UAV data and measure running time on a real-time embedded platform. We believe this is the first work able to recover the pose of a UAV at this scale and rate of convergence while allowing significant seasonal difference between camera observations and the map.

    Peer reviewed
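    Convergence from an uninformed initialization, as described above, can be illustrated with a recursive Bayesian update over a discretized grid of candidate poses. This is a generic sketch, not the paper's likelihood estimation method: the grid size, the synthetic similarity scores, and the uniform prior are all assumptions.

    ```python
    import numpy as np

    def update_belief(belief: np.ndarray, likelihood: np.ndarray) -> np.ndarray:
        """One recursive Bayesian measurement update over a discretized map:
        posterior ∝ prior × measurement likelihood, renormalized to sum to 1."""
        post = belief * likelihood
        return post / post.sum()

    # Uninformed start: uniform belief over a 10x10 grid of candidate poses.
    belief = np.full((10, 10), 1.0 / 100)
    rng = np.random.default_rng(42)
    true_cell = (3, 7)
    for _ in range(20):                      # repeated descriptor-matching updates
        lik = 0.1 + rng.random((10, 10))     # noisy per-cell similarity scores...
        lik[true_cell] += 1.0                # ...consistently higher at the true pose
        belief = update_belief(belief, lik)

    # After enough updates the belief mass concentrates on the true cell.
    estimate = np.unravel_index(belief.argmax(), belief.shape)
    ```

    Each noisy measurement only slightly favors the true pose, but the multiplicative updates compound, which is why the belief converges within a modest number of updates even from a fully uniform prior.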